# Trillion-token training
**Poro 34B** · LumiOpen · Apache-2.0 · 1,908 downloads · 116 likes
Poro is a 34-billion-parameter multilingual large language model focused on Finnish, English, and code, released under the Apache 2.0 license.
Tags: Large Language Model · Transformers · Supports Multiple Languages

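A minimal usage sketch with the Transformers library; the repository id `LumiOpen/Poro-34B`, the dtype choice, and the Finnish prompt are illustrative assumptions rather than details taken from this listing.

```python
# Minimal sketch, assuming the checkpoint is published as "LumiOpen/Poro-34B"
# and loads through the standard Transformers causal-LM classes.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "LumiOpen/Poro-34B"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # a 34B model needs reduced precision and large GPU memory
    device_map="auto",
)

prompt = "Suomen pääkaupunki on"  # Finnish: "The capital of Finland is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
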
**StarCoderBase-1B** · bigcode · OpenRAIL · 12.79k downloads · 78 likes
StarCoderBase-1B is a 1-billion-parameter code generation model trained on more than 80 programming languages, with support for fill-in-the-middle generation and multi-query attention.
Tags: Large Language Model · Transformers · Other

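A minimal fill-in-the-middle sketch; the repository id `bigcode/starcoderbase-1b` and the StarCoder-style control tokens (`<fim_prefix>`, `<fim_suffix>`, `<fim_middle>`) are assumptions to verify against the model's tokenizer.

```python
# Minimal fill-in-the-middle (FIM) sketch, assuming StarCoder-style control tokens.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "bigcode/starcoderbase-1b"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# The model is asked to fill the gap between a prefix and a suffix.
prefix = "def fibonacci(n):\n    "
suffix = "\n    return a\n"
prompt = f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
# Keep only the newly generated middle section, then reassemble the snippet.
middle = tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(prefix + middle + suffix)
```
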
**OpenLLaMA 3B** · openlm-research · Apache-2.0 · 26.20k downloads · 157 likes
OpenLLaMA is an open-source reproduction of Meta AI's LLaMA large language model, released at 3B, 7B, and 13B parameter scales; this entry is the 3B model.
Tags: Large Language Model · Transformers

**OpenLLaMA 7B** · openlm-research · Apache-2.0 · 25.16k downloads · 130 likes
OpenLLaMA is an open-source reproduction of Meta AI's LLaMA large language model, released at 3B, 7B, and 13B parameter scales; this entry is the 7B model.
Tags: Large Language Model · Transformers

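A minimal loading sketch for either of the two OpenLLaMA entries above; the repository ids and the tokenizer caveat in the comments are assumptions to check against the model cards.

```python
# Minimal sketch for either OpenLLaMA checkpoint, assuming the repository ids
# "openlm-research/open_llama_3b" / "open_llama_7b".
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer

model_id = "openlm-research/open_llama_7b"  # or "openlm-research/open_llama_3b"
# The slow (SentencePiece) tokenizer is used here; the auto-converted fast
# tokenizer has been reported to mis-handle whitespace for OpenLLaMA.
tokenizer = LlamaTokenizer.from_pretrained(model_id)
model = LlamaForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

prompt = "Q: What is the largest animal?\nA:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
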
**StarCoderBase** · bigcode · OpenRAIL · 3,216 downloads · 404 likes
StarCoderBase is a 15.5-billion-parameter code generation model trained on more than 80 programming languages, supporting code completion and generation tasks.
Tags: Large Language Model · Transformers · Other

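A minimal code-completion sketch using the Transformers text-generation pipeline; the repository id `bigcode/starcoderbase` and the precision/device settings are assumptions, and the 15.5B model requires substantial GPU memory.

```python
# Minimal code-completion sketch, assuming the repository id "bigcode/starcoderbase".
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="bigcode/starcoderbase",  # assumed repository id; 15.5B parameters
    torch_dtype=torch.bfloat16,     # reduced precision to fit on a single large GPU
    device_map="auto",
)

prompt = "def is_palindrome(s: str) -> bool:\n    "
result = generator(prompt, max_new_tokens=48, do_sample=False)
print(result[0]["generated_text"])
```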